
    Pedagogically informed metadata content and structure for learning and teaching

    In order to search, compare, gap-analyse, recommend, and visualise learning objects, learning resources, or teaching assets, the metadata structure and content must support pedagogically informed reasoning, inference, and machine processing over the knowledge representations. In this paper, we present the difficulties with current metadata standards in education, the educational version of Dublin Core and IEEE LOM, using examples drawn from the areas of e-learning, institutional admissions, and learners seeking courses. The paper suggests expanded metadata components based on an e-learning system engineering model to support pedagogically informed interoperability. We illustrate some examples of the metadata relevant to competency in the nurse training domain.
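    As a rough illustration of what such expanded, competency-aware metadata might look like in machine-processable form, the sketch below uses Python dataclasses. The class and field names (Competency, LearningResourceMetadata, cognitive_level, and so on) are assumptions made for illustration, not the schema proposed in the paper.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Competency:
    """One intended competency, made explicit so systems can reason over it."""
    subject_matter: str      # e.g. "administering intramuscular injections"
    cognitive_level: str     # e.g. "apply" (a Bloom-style level)
    context: str             # e.g. "adult nursing ward placement"

@dataclass
class LearningResourceMetadata:
    """A resource record extending keyword-style metadata with competencies."""
    title: str
    dc_subject: str          # conventional Dublin Core-style descriptor
    intended_competencies: List[Competency] = field(default_factory=list)

record = LearningResourceMetadata(
    title="Safe medication administration tutorial",
    dc_subject="Nursing",
    intended_competencies=[
        Competency("administering intramuscular injections",
                   "apply", "adult nursing ward placement"),
    ],
)

# With competencies explicit, a service can compare, gap-analyse, or recommend
# against record.intended_competencies rather than matching free-text keywords.
```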

    Validation of Serious Games Attributes Using the Technology Acceptance Model

    The paper introduces a conceptual model for the design of serious games and uses the Technology Acceptance Model (TAM) for its validation. A specially developed game introduced international students to public transport in Southampton. After playing the game, participants completed a short questionnaire, and the data were analysed using structural equation modelling (SEM). The results identified the attributes, and combinations of attributes, that led learners to accept and use the serious game for learning. These findings can help game designers and educational practitioners design serious games for effective learning.
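    For readers unfamiliar with how a TAM-style model is validated with SEM, the sketch below specifies and fits an illustrative model using the semopy library. The construct names (PU, PEOU, BI), item names, and data file are placeholders, not the study's actual instrument or dataset.

```python
import pandas as pd
from semopy import Model

# Illustrative TAM structure in lavaan-like syntax:
# PU = perceived usefulness, PEOU = perceived ease of use,
# BI = behavioural intention to use the serious game.
MODEL_DESC = """
PU   =~ pu1 + pu2 + pu3
PEOU =~ peou1 + peou2 + peou3
BI   =~ bi1 + bi2
PU ~ PEOU
BI ~ PU + PEOU
"""

# Hypothetical questionnaire export with one column per item (pu1, pu2, ...).
data = pd.read_csv("questionnaire_responses.csv")

model = Model(MODEL_DESC)
model.fit(data)
print(model.inspect())   # estimated path coefficients and their significance
```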

    Technology enhanced interaction framework

    This paper focuses on the development of a general interaction framework to help design technology that supports communication between people and improves interactions between people, technology, and objects, particularly in complex situations. A review of existing interaction frameworks shows that none of them helps technology designers and developers consider all of the possible interactions that occur at the same time and in the same place. The main components and sub-components of the framework are described and explained, and examples are given for each type of interaction. Work is now in progress to provide designers with an easy-to-use tool that helps them apply the framework to create technology solutions to complex communication and interaction problems.
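    To make the idea of covering every co-occurring interaction concrete, the sketch below enumerates the pairwise interaction types between the entity categories named in the abstract (people, technology, objects). The category names come from the abstract; the framework's actual sub-components are richer than this.

```python
from itertools import combinations_with_replacement

# Entity categories mentioned in the abstract; placeholders for the
# framework's fuller component list.
ENTITIES = ["person", "technology", "object"]

# Every pairwise interaction type that could occur at the same time and
# place, so a designer can check that each has been considered.
for a, b in combinations_with_replacement(ENTITIES, 2):
    print(f"{a} <-> {b}")

# person <-> person, person <-> technology, person <-> object,
# technology <-> technology, technology <-> object, object <-> object
```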

    An evaluation of pedagogically informed parameterised questions for self assessment

    Self-assessment is a crucial component of learning: learners can learn by asking themselves questions and attempting to answer them. However, creating effective questions is time-consuming, because it may require considerable resources and critical-thinking skill. Questions need careful construction to accurately represent the intended learning outcome and the subject matter involved. Very few systems currently generate questions automatically, and those that do are confined to specific domains. This paper presents a system for automatically generating questions from a competency framework, based on a sound pedagogical and technological approach. This makes it possible to guide learners in developing questions for themselves, and to provide authoring templates that speed the creation of new questions for self-assessment. The novel design and implementation involve an ontological database that represents the intended learning outcome to be assessed across a number of dimensions, including level of cognitive ability and subject matter. The system generates a list of all the questions that are possible from a given learning outcome, which may then be used to test for understanding and so determine the degree to which learners actually acquire the desired knowledge. The way in which the system has been designed and evaluated is discussed, along with its educational benefits.
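    As a rough sketch of parameterised question generation over the two dimensions mentioned (level of cognitive ability and subject matter), the example below combines question templates with topics. The templates, levels, and topics are illustrative placeholders; the actual system draws these dimensions from its ontological database of intended learning outcomes.

```python
from itertools import product

# Bloom-style cognitive levels mapped to illustrative question templates.
TEMPLATES = {
    "remember":   "List the key facts about {topic}.",
    "understand": "Explain in your own words how {topic} works.",
    "apply":      "Describe how you would use {topic} during a placement.",
}

# Placeholder subject-matter topics (here from a nursing-like domain).
TOPICS = ["aseptic technique", "drug dosage calculation"]

def generate_questions(templates, topics):
    """Enumerate every (cognitive level, topic) combination as a question."""
    return [(level, template.format(topic=topic))
            for (level, template), topic in product(templates.items(), topics)]

for level, question in generate_questions(TEMPLATES, TOPICS):
    print(f"[{level}] {question}")
```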

    Issues in conducting expert validation and review and user evaluation of the technology enhanced interaction framework and method

    A Technology Enhanced Interaction Framework has been developed to support designers and developers in designing and developing technology-enhanced interactions for complex scenarios involving disabled people. Issues of motivation, time, and understanding when validating and evaluating the framework were identified through a literature review and through questionnaires and interviews with experts. Changes to content, system, and approach were made to address the identified issues. Future work will involve detailed analysis of the expert review and validation findings and the implementation of a motivating approach to user evaluation.

    Using the technology enhanced interaction framework for interaction scenarios involving disabled people

    This paper focuses on the development of a general interaction framework to help design technology that supports communication between people and improves interactions between people, technology, and objects, particularly in complex situations involving disabled people. The main components and sub-components of the framework are described. A tool was developed to provide advice on design and development factors for technological support. Work is now in progress to validate the framework and the tool with expert designers and accessibility experts before evaluating them with technology designers.
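    At its simplest, an advice tool of the kind described might map scenario attributes to design recommendations. The sketch below is only an assumption about that shape: the attribute names and advice strings are invented for illustration and are not the tool's actual content.

```python
# Hypothetical rule table: scenario attribute -> design/development advice.
ADVICE_RULES = {
    "hearing impairment": [
        "Provide visual or haptic equivalents for audio cues.",
        "Support text-based communication channels.",
    ],
    "noisy environment": [
        "Avoid relying on speech input or output alone.",
    ],
}

def advise(scenario_attributes):
    """Collect the advice triggered by the attributes present in a scenario."""
    advice = []
    for attribute in scenario_attributes:
        advice.extend(ADVICE_RULES.get(attribute, []))
    return advice

print(advise(["hearing impairment", "noisy environment"]))
```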

    Towards a competency model for adaptive assessment to support lifelong learning

    Adaptive assessment provides efficient and personalised routes to establishing the proficiencies of learners. We can envisage a future in which learners maintain their competency profile throughout their lives and expose it to multiple services, which use the competency information in the model to personalise assessment. Current competency standards tend to oversimplify the representation of competency and the knowledge domain. This paper presents a competency model for evaluating learned capability by considering achieved competencies, in order to support adaptive assessment for lifelong learning. The model provides a multidimensional view of competencies and supports interoperability between systems as the learner progresses through life. The proposed competency model is being developed and implemented in the JISC-funded Placement Learning and Assessment Toolkit (mPLAT) project at the University of Southampton. The project, which takes a service-oriented approach, will contribute to the JISC community by adding mobile assessment tools to the e-Framework.
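    A minimal sketch of what a multidimensional competency record and a simple adaptive item-selection step might look like is given below. The class and field names (AchievedCompetency, cognitive_level, and so on) and the selection rule are assumptions for illustration, not the mPLAT project's actual model.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AchievedCompetency:
    """One competency a learner has demonstrated, with an evidence note."""
    subject: str            # e.g. "wound care"
    cognitive_level: int    # e.g. 1 = remember ... 5 = evaluate
    evidence: str           # e.g. "observed on clinical placement"

@dataclass
class AssessmentItem:
    """One assessment item targeting a subject at a given cognitive level."""
    subject: str
    cognitive_level: int
    prompt: str

def next_item(profile: List[AchievedCompetency],
              bank: List[AssessmentItem]) -> Optional[AssessmentItem]:
    """Pick the lowest-level item just above what the learner has shown so far."""
    achieved = {c.subject: c.cognitive_level for c in profile}
    candidates = [item for item in bank
                  if item.cognitive_level > achieved.get(item.subject, 0)]
    return min(candidates, key=lambda item: item.cognitive_level, default=None)
```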